On DNF Approximators for Monotone Boolean Functions
Authors
Abstract
We study the complexity of approximating monotone Boolean functions with disjunctive normal form (DNF) formulas, exploring two main directions. First, we construct DNF approximators for arbitrary monotone functions achieving one-sided error: we show that every monotone f can be ε-approximated by a DNF g of size 2^(n − Ω_ε(√n)) satisfying g(x) ≤ f(x) for all x ∈ {0,1}^n. This is the first non-trivial universal upper bound even for DNF approximators incurring two-sided error. Next, we study the power of negations in DNF approximators for monotone functions. We exhibit monotone functions for which non-monotone DNFs perform better than monotone ones, giving separations with respect to both DNF size and width. Our results, when taken together with a classical theorem of Quine [1], highlight an interesting contrast between approximation and exact computation in the DNF complexity of monotone functions, and they add to a line of work on the surprising role of negations in monotone complexity [2,3,4].
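To make the objects in the abstract concrete, here is a minimal brute-force sketch (an illustration, not taken from the paper): a DNF is represented as a list of terms over signed literals, and the checker verifies that a DNF g is a one-sided ε-approximator of a function f, i.e. g(x) ≤ f(x) for every x ∈ {0,1}^n and g and f disagree on at most an ε fraction of inputs under the uniform distribution. The representation and the maj3 example are hypothetical choices made for illustration.

from itertools import product

def eval_dnf(terms, x):
    # A term is a set of signed literals: +i requires x_i = 1, -i requires x_i = 0
    # (variables indexed from 1); the DNF is true iff some term is fully satisfied.
    return any(all((x[abs(l) - 1] == 1) if l > 0 else (x[abs(l) - 1] == 0) for l in term)
               for term in terms)

def is_one_sided_eps_approximator(terms, f, n, eps):
    # g(x) <= f(x) must hold everywhere, and g, f may disagree on at most eps * 2^n inputs.
    errors = 0
    for x in product((0, 1), repeat=n):
        g_val, f_val = eval_dnf(terms, x), bool(f(x))
        if g_val and not f_val:
            return False          # g exceeds f somewhere: one-sidedness violated
        errors += int(g_val != f_val)
    return errors <= eps * 2 ** n

# Illustrative check: majority on 3 bits, under-approximated by the single monotone term x1 AND x2.
maj3 = lambda x: sum(x) >= 2
print(is_one_sided_eps_approximator([{1, 2}], maj3, n=3, eps=0.25))  # True: 2 of the 8 inputs disagree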
Similar papers
A Solution of the P versus NP Problem
Berg and Ulfberg, and Amano and Maruoka, have used CNF-DNF approximators to prove exponential lower bounds for the monotone network complexity of the clique function and of Andreev's function. We show that these approximators can also be used to prove the same lower bounds for their non-monotone network complexity. This implies P ≠ NP.
Probabilistic Construction of Monotone Formulae for Positive Linear Threshold Functions
We extend Valiant's construction of monotone formulae for the majority function to obtain an efficient probabilistic construction of small monotone formulae for arbitrary positive linear threshold functions. We show that any positive linear threshold function on n Boolean variables with weight complexity q(n) can be computed by a monotone Boolean formula of size O(q(n)^3.3 · n^2). Our technique...
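As an aside on the technique mentioned in this snippet, the short numerical sketch below illustrates Valiant-style amplification in general (it is not this paper's extension to threshold functions): one level of the combining gadget (x1 AND x2) OR (x3 AND x4) maps a leaf bias p to A(p) = 1 − (1 − p²)², and biases above the unstable fixed point (√5 − 1)/2 ≈ 0.618 are driven toward 1 while biases below it are driven toward 0, which is what lets a random monotone formula of modest depth compute a threshold-like function.

from math import sqrt

def amplify(p):
    # Probability that (x1 AND x2) OR (x3 AND x4) is 1 when its four inputs
    # are independent and each equal to 1 with probability p.
    return 1.0 - (1.0 - p * p) ** 2

beta = (sqrt(5) - 1) / 2  # unstable fixed point of amplify
for p0 in (beta - 0.05, beta + 0.05):
    p = p0
    for _ in range(12):   # twelve levels of the random formula
        p = amplify(p)
    print(f"starting bias {p0:.3f} -> after 12 levels {p:.6f}")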
Learning of Boolean Functions Using Support Vector Machines
This paper concerns the design of a Support Vector Machine (SVM) appropriate for the learning of Boolean functions. This is motivated by the need for a more sophisticated algorithm for classification in discrete attribute spaces. Classification in discrete attribute spaces is reduced to the problem of learning Boolean functions from examples of their input/output behavior. Since any Boolean function...
Analysis of Case-Based Representability of Boolean Functions by Monotone Theory
Classification is one of the major tasks in case-based reasoning (CBR), and many studies have analyzed properties of case-based classification [1, 14, 10, 15, 12, 9, 13, 7]. However, these studies only consider numerical similarity measures, whereas there are other kinds of similarity measure for different tasks. Among these measures, the HYPO system [2, 3] in a legal domain uses a similarity ...
Almost all monotone Boolean functions are polynomially learnable using membership queries
We consider exact learning, or identification, of monotone Boolean functions using only membership queries. It is shown that almost all monotone Boolean functions are identifiable in time polynomial in the number of input variables and in the output size, i.e. the sum of the sizes of the CNF and DNF representations.
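For context on what membership queries buy in this setting, the following is a minimal sketch of a standard subroutine used by such algorithms (not this particular paper's procedure): given a membership oracle f and a point on which f evaluates to 1, greedily flip 1s to 0 while f stays 1. For monotone f the result is a minimal true point, i.e. it identifies one term of the monotone DNF of f. The oracle f below is a hypothetical example.

def minimize_true_point(f, x):
    # x is a tuple of 0/1 bits with f(x) == 1; each tested flip costs one membership query.
    x = list(x)
    for i in range(len(x)):
        if x[i] == 1:
            x[i] = 0
            if not f(tuple(x)):   # dropping this bit left the true region, so it is needed
                x[i] = 1
    return tuple(x)

# Hypothetical oracle: f(x) = (x0 AND x1) OR x2 on three variables.
f = lambda x: bool((x[0] and x[1]) or x[2])
print(minimize_true_point(f, (1, 1, 1)))  # (0, 0, 1), i.e. the term x2 of the monotone DNF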